An Analysis Framework for Landmark Selection and Nyström Kernel Approximation: Bounds, Algorithms, and Applications

Abstract

In recent years, the spectral analysis of appropriately defined kernels has emerged as a principled way to extract the nonlinear structure inherent in vision tasks as diverse as segmentation and recognition. For computational reasons, a landmark selection process is often employed to choose a partial kernel, followed by a Nyström extension that provides its minimally positive definite completion. The literature remains open on the question of optimal landmark selection, however, and here we develop an analysis framework for this problem that subsumes previous approaches. We first show how it leads to quantitative performance bounds for both existing and new algorithms. We then discuss a range of methods for optimizing the landmark selection process through a stochastic procedure, and finally demonstrate their effectiveness in two key applications: the low-level vision task of image segmentation, and the high-level task of nonlinear dimensionality reduction from video data.
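The computation the abstract refers to can be made concrete with a small sketch. The snippet below is illustrative Python, not the authors' code: it assumes an RBF kernel and uses uniform random landmark selection as a stand-in for the optimized stochastic selection procedures the paper studies, then forms the standard Nyström completion K_hat = C W^+ C^T from the selected columns.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def nystrom_approximation(X, m, gamma=1.0, seed=0):
    """Rank-<=m Nystrom approximation K_hat = C W^+ C^T of the full kernel.

    C is the n x m partial kernel between all points and m landmarks chosen
    uniformly at random; W is the m x m landmark-landmark block.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    landmarks = rng.choice(n, size=m, replace=False)   # landmark selection step
    C = rbf_kernel(X, X[landmarks], gamma)             # n x m partial kernel
    W = C[landmarks]                                   # m x m landmark block
    K_hat = C @ np.linalg.pinv(W) @ C.T                # Nystrom completion (PSD)
    return K_hat, landmarks

# Example: relative Frobenius error of the approximation on a small problem.
X = np.random.default_rng(1).normal(size=(500, 10))
K = rbf_kernel(X, X)
K_hat, _ = nystrom_approximation(X, m=50)
print(np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro"))
```

With m much smaller than n, only the n x m block C and the m x m block W of the kernel ever need to be evaluated, which is where the computational savings come from; the full kernel is computed above only to check approximation quality.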

Similar articles

On landmark selection and sampling in high-dimensional data analysis, by Mohamed-Ali Belabbas

In recent years, the spectral analysis of appropriately defined kernel matrices has emerged as a principled way to extract the low-dimensional structure often prevalent in high-dimensional data. Here, we provide an introduction to spectral methods for linear and nonlinear dimension reduction, emphasizing ways to overcome the computational limitations currently faced by practitioners with massive...

Fast Prediction for Large-Scale Kernel Machines

Kernel machines such as kernel SVM and kernel ridge regression usually construct high-quality models; however, their use in real-world applications remains limited due to the high prediction cost. In this paper, we present two novel insights for improving the prediction efficiency of kernel machines. First, we show that by adding “pseudo landmark points” to the classical Nyström kernel approximation...

SPSD Matrix Approximation via Column Selection: Theories, Algorithms, and Extensions

Symmetric positive semidefinite (SPSD) matrix approximation is an important problem with applications in kernel methods. However, existing SPSD matrix approximation methods such as the Nyström method only have weak error bounds. In this paper we conduct in-depth studies of an SPSD matrix approximation model and establish strong relative-error bounds. We call it the prototype model for it has mo...

The Modified Nyström Method: Theories, Algorithms, and Extension

Symmetric positive semidefinite (SPSD) matrix approximation is an important problem with applications in kernel methods. However, existing SPSD matrix approximation methods such as the Nyström method only have weak error bounds. In this paper we conduct in-depth studies of an SPSD matrix approximation model and establish strong relative-error bounds. We call it the prototype model for it has mo...

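The last two entries both study what they call the prototype model for SPSD matrix approximation. As a point of contrast with the standard Nyström completion sketched above, the following is a minimal sketch under the usual statement of that model, in which the intersection matrix U is chosen to minimize the Frobenius error ||K - C U C^T||_F rather than taken as W^+; the function name is illustrative and the closed form U = C^+ K (C^+)^T is assumed from that formulation rather than quoted from the truncated abstracts.

```python
import numpy as np

def prototype_spsd_approximation(K, landmarks):
    """SPSD approximation from selected columns with an optimized intersection matrix.

    C = K[:, landmarks]; U = C^+ K (C^+)^T minimizes ||K - C U C^T||_F.
    Unlike the standard Nystrom completion, this needs the full K and O(n^2 m)
    work, and it is the model for which the cited papers establish
    relative-error bounds.
    """
    C = K[:, landmarks]                 # n x m selected columns
    C_pinv = np.linalg.pinv(C)          # m x n Moore-Penrose pseudo-inverse
    U = C_pinv @ K @ C_pinv.T           # Frobenius-optimal intersection matrix
    return C @ U @ C.T
```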



Publication year: 2008